Dropout training for Hidden Unit CRFs

Authors

  • Anirudh Vemula
  • Senthil Purushwalkam
  • Varun Joshi
Abstract

Overfitting is a common problem when training prediction models with machine learning. Dropout is a recently developed technique designed to counter this issue in deep neural networks, and it has also been extended to other algorithms such as SVMs. In this project, we formulate and study the application of dropout to Hidden-Unit Conditional Random Fields (HUCRFs). HUCRFs use binary stochastic hidden units to model the underlying latent structure in the data and are therefore similar to neural networks. The dropout technique proposed in this report drops a random subset of hidden units for each training example. For test-time inference, we derive an inference equation that is equivalent to computing the expected prediction over the ensemble of all dropout models. We also evaluate our dropout training and test formulations on OCR, varying both model complexity and the amount of dropout. Results show that HUCRF models trained with dropout consistently outperform the baseline HUCRF.
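The two halves of the technique described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the hidden-unit potentials come from a linear map `W @ x`, and it uses the standard dropout approximation (scaling by the retention probability at test time) to stand in for the ensemble-expectation inference the report derives. All names (`dropout_mask`, `hidden_potentials_train`, `hidden_potentials_test`, `p_drop`) are hypothetical.

```python
import numpy as np

def dropout_mask(n_hidden, p_drop, rng):
    """Sample a binary mask keeping each hidden unit with prob 1 - p_drop."""
    return (rng.random(n_hidden) >= p_drop).astype(float)

def hidden_potentials_train(x, W, p_drop, rng):
    # Training: drop a random subset of hidden units for this example,
    # so each example is trained against a different thinned model.
    mask = dropout_mask(W.shape[0], p_drop, rng)
    return mask * (W @ x)

def hidden_potentials_test(x, W, p_drop):
    # Test: scale potentials by the retention probability, a standard
    # approximation to averaging predictions over all dropout models.
    return (1.0 - p_drop) * (W @ x)
```

With `p_drop = 0.5`, each hidden unit's potential is kept or zeroed with equal probability during training, and halved at test time so its expected contribution matches.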


Related articles

Compacting Neural Network Classifiers via Dropout Training

We introduce dropout compaction, a novel method for training feed-forward neural networks which realizes the performance gains of training a large model with dropout regularization, yet extracts a compact neural network for run-time efficiency. In the proposed method, we introduce a sparsity-inducing prior on the per unit dropout retention probability so that the optimizer can effectively prune...

Hidden-Unit Conditional Random Fields

The paper explores a generalization of conditional random fields (CRFs) in which binary stochastic hidden units appear between the data and the labels. Hidden-unit CRFs are potentially more powerful than standard CRFs because they can represent nonlinear dependencies at each frame. The hidden units in these models also learn to discover latent distributed structure in the data that improves cla...

Fast Learning with Noise in Deep Neural Nets

Dropout has been proposed as an effective and simple trick [1] to combat overfitting in deep neural nets. The idea is to randomly mask out input and internal units during training. Despite its usefulness, understanding of the effects of injecting noise into the internal units of deep learning architectures remains limited and scattered. In this paper, we study the effect of dropout on both input and hidden l...

Pre-training of Hidden-Unit CRFs

In this paper, we apply the concept of pretraining to hidden-unit conditional random fields (HUCRFs) to enable learning on unlabeled data. We present a simple yet effective pre-training technique that learns to associate words with their clusters, which are obtained in an unsupervised manner. The learned parameters are then used to initialize the supervised learning process. We also propose a w...

Multilingual Training and Cross-lingual Adaptation on CTC-based Acoustic Model

Phoneme-based multilingual training and different cross-lingual adaptation techniques for Automatic Speech Recognition (ASR) are explored in Connectionist Temporal Classification (CTC)-based systems. The multilingual model is trained to model a universal IPA-based phone set using the CTC loss function. While the same IPA symbol may not correspond to acoustic similarity, Learning Hidden Unit Contribu...


Journal:

Volume   Issue

Pages  -

Publication date: 2015